Multiclass Classification Based on Meta Probability Codes
Authors
Abstract
This paper proposes a new approach to improve multiclass classification performance by employing a Stacked Generalization structure and a One-Against-One decomposition strategy. The proposed approach encodes the outputs of all pairwise classifiers, implicitly embedding two-class discriminative information in a probabilistic manner. The encoded outputs, called Meta Probability Codes (MPCs), are interpreted as projections of the original features. It is observed that MPCs are better suited to clustering than the original features. Based on MPC, we introduce a cluster-based multiclass classification algorithm, called MPC-Clustering. The MPC-Clustering algorithm uses the proposed approach to project the original feature space to MPC, and then employs a clustering scheme to cluster the MPCs. Subsequently, it trains individual multiclass classifiers on the produced clusters to complete the multiclass classifier induction procedure. The performance of the proposed algorithm is extensively evaluated on 20 datasets from the UCI machine learning repository. The results show that MPC-Clustering is quite efficient, with an improvement of 2.4% in overall classification rate over state-of-the-art multiclass classifiers.
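The pipeline described in the abstract can be summarized in code. The following is a minimal, hypothetical sketch: the choice of logistic regression for the pairwise classifiers, k-means for clustering, random forests as the per-cluster classifiers, and the parameter n_clusters are illustrative assumptions, not the configuration used in the paper.

```python
# Hypothetical sketch of the MPC-Clustering pipeline; base learners and
# parameters are assumptions, not the authors' exact setup.
from itertools import combinations

import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression


def fit_pairwise_classifiers(X, y):
    """One-Against-One: train a probabilistic binary classifier per class pair."""
    pairs, models = [], []
    for a, b in combinations(np.unique(y), 2):
        mask = np.isin(y, [a, b])
        clf = LogisticRegression(max_iter=1000).fit(X[mask], y[mask] == b)
        pairs.append((a, b))
        models.append(clf)
    return pairs, models


def meta_probability_codes(X, models):
    """Project samples into MPC space: one pairwise probability per dimension."""
    return np.column_stack([m.predict_proba(X)[:, 1] for m in models])


class MPCClustering:
    """Sketch of MPC-Clustering (not the authors' implementation)."""

    def __init__(self, n_clusters=3):
        self.n_clusters = n_clusters

    def fit(self, X, y):
        self.pairs_, self.models_ = fit_pairwise_classifiers(X, y)
        Z = meta_probability_codes(X, self.models_)  # MPC projection
        self.kmeans_ = KMeans(n_clusters=self.n_clusters, n_init=10).fit(Z)
        labels = self.kmeans_.labels_
        # Train one multiclass classifier per MPC cluster.
        self.experts_ = {
            c: RandomForestClassifier().fit(X[labels == c], y[labels == c])
            for c in range(self.n_clusters)
        }
        return self

    def predict(self, X):
        Z = meta_probability_codes(X, self.models_)
        clusters = self.kmeans_.predict(Z)
        y_pred = np.empty(len(X), dtype=object)
        for c in np.unique(clusters):
            idx = clusters == c
            y_pred[idx] = self.experts_[c].predict(X[idx])
        return y_pred
```

In this sketch, a test sample is first projected into MPC space, assigned to its nearest cluster, and then labeled by that cluster's classifier, mirroring the induction procedure described in the abstract.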
Related resources
FilterBoost: Regression and Classification on Large Datasets
We study boosting in the filtering setting, where the booster draws examples from an oracle instead of using a fixed training set and so may train efficiently on very large datasets. Our algorithm FilterBoost, which is based on a logistic regression technique proposed by Collins et al. (2002), requires fewer assumptions to achieve bounds equivalent to or better than previous work. Our proofs de...
Using Two-Class Classifiers for Multiclass Classification
The generalization from two-class classification to multiclass classification is not straightforward for discriminants which are not based on density estimation. Simple combining methods use voting, but this has the drawback of inconsistent labelings and ties. More advanced methods map the discriminant outputs to approximate posterior probability estimates and combine these, while other methods...
Applying Multiple Complementary Neural Networks to Solve Multiclass Classification Problem
In this paper, a multiclass classification problem is solved using multiple complementary neural networks. Two techniques are applied to multiple complementary neural networks: one-against-all and error-correcting output codes. We experiment with our proposed techniques on an extremely imbalanced data set named glass from the UCI machine learning repository. It is found that the combinati...
Constraint Classification for Multiclass Classification and Ranking
The constraint classification framework captures many flavors of multiclass classification including winner-take-all multiclass classification, multilabel classification and ranking. We present a meta-algorithm for learning in this framework that learns via a single linear classifier in high dimension. We discuss distribution independent as well as margin-based generalization bounds and present...
Reducing multiclass to binary by coupling probability estimates
This paper presents a method for obtaining class membership probability estimates for multiclass classification problems by coupling the probability estimates produced by binary classifiers. This is an extension for arbitrary code matrices of a method due to Hastie and Tibshirani for pairwise coupling of probability estimates. Experimental results with Boosted Naive Bayes show that our method p...
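The pairwise coupling method attributed above to Hastie and Tibshirani can be sketched as follows. This is a minimal illustration under stated assumptions: uniform pair weights, the convergence tolerance, and the toy matrix r are illustrative choices, not values from the paper.

```python
# Minimal sketch of pairwise coupling of probability estimates
# (Hastie & Tibshirani style); weights and tolerance are assumptions.
import numpy as np


def pairwise_coupling(r, n_iter=100, tol=1e-6):
    """Iteratively recover class probabilities p from pairwise estimates r.

    r[i, j] approximates P(class i | class i or j, x) for i != j.
    """
    K = r.shape[0]
    p = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # Model-implied pairwise probabilities mu[i, j] = p_i / (p_i + p_j).
        mu = p[:, None] / (p[:, None] + p[None, :] + 1e-12)
        num = np.array([r[i].sum() - r[i, i] for i in range(K)])   # exclude diagonal
        den = np.array([mu[i].sum() - mu[i, i] for i in range(K)])
        p_new = p * num / np.maximum(den, 1e-12)
        p_new /= p_new.sum()
        if np.abs(p_new - p).max() < tol:
            return p_new
        p = p_new
    return p


# Toy example with three classes and roughly consistent pairwise estimates.
r = np.array([[0.5, 0.7, 0.8],
              [0.3, 0.5, 0.6],
              [0.2, 0.4, 0.5]])
print(pairwise_coupling(r))  # most probability mass on class 0
```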
Journal: IJPRAI
Volume 25, Issue -
Pages: -
Publication year: 2011